
    Efficiency of evolutionary algorithms in water network pipe sizing

    The pipe sizing of water networks via evolutionary algorithms is of great interest because it allows the selection of alternative economical solutions that meet a set of design requirements. However, available evolutionary methods are numerous, and methodologies to compare the performance of these methods beyond obtaining a minimal solution for a given problem are currently lacking. A methodology to compare algorithms based on an efficiency rate (E) is presented here and applied to the pipe-sizing problem of four medium-sized benchmark networks (Hanoi, New York Tunnel, GoYang and R-9 Joao Pessoa). E numerically determines the performance of a given algorithm while also considering the quality of the obtained solution and the required computational effort. From the wide range of available evolutionary algorithms, four were selected to implement the methodology: a Pseudo-Genetic Algorithm (PGA), Particle Swarm Optimization (PSO), Harmony Search (HS) and a modified Shuffled Frog Leaping Algorithm (SFLA). After more than 500,000 simulations, a statistical analysis was performed based on the specific parameters each algorithm requires to operate, and finally, E was analyzed for each network and algorithm. The efficiency rate indicated that PGA is the most efficient algorithm for problems of greater complexity and that HS is the most efficient for less complex problems. The main contribution of this work, however, is that the proposed efficiency rate provides a neutral strategy for comparing optimization algorithms and may be useful in the future for selecting the most appropriate algorithm for different types of optimization problems.
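    The abstract names the two ingredients of the efficiency rate E — solution quality and computational effort — but does not reproduce its exact formula. The sketch below is therefore only one plausible form under the assumption that E rewards cost ratio per unit of evaluation budget; the function name and normalization are illustrative, not the paper's definition.

```python
def efficiency_rate(found_cost, best_known_cost, evaluations, max_evaluations):
    """Hypothetical efficiency rate combining the two ingredients the paper
    names: solution quality and computational effort.

    quality -> 1.0 when the algorithm reaches the best-known solution cost;
    effort  -> 1.0 when the full evaluation budget is spent.
    Higher E means a better solution found with less computation.
    """
    quality = best_known_cost / found_cost      # <= 1.0, dimensionless
    effort = evaluations / max_evaluations      # fraction of budget used
    return quality / effort

# an algorithm reaching 110% of the best-known cost with half the budget
e = efficiency_rate(found_cost=110.0, best_known_cost=100.0,
                    evaluations=5000, max_evaluations=10000)
```

    Under this form, an algorithm that finds the best-known solution using the whole budget scores E = 1, and halving the evaluations doubles E for the same solution quality.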

    Evolutionary algorithms and other metaheuristics in water resources: Current status, research challenges and future directions

    Abstract not available.

    H.R. Maier, Z. Kapelan, Kasprzyk, J. Kollat, L.S. Matott, M.C. Cunha, G.C. Dandy, M.S. Gibbs, E. Keedwell, A. Marchi, A. Ostfeld, D. Savic, D.P. Solomatine, J.A. Vrugt, A.C. Zecchin, B.S. Minsker, E.J. Barbour, G. Kuczera, F. Pasha, A. Castelletti, M. Giuliani, P.M. Ree

    Communicating effectively about CSR on Twitter: The power of engaging strategies and storytelling elements

    Purpose: Corporate social responsibility (CSR) communication is becoming increasingly important for brands and companies. Social media such as Twitter may be particularly suited to this topic, given their ability to foster dialogue and content diffusion. The purpose of this paper is to investigate factors driving the effectiveness of CSR communication on Twitter, with a focus on communication strategies and elements of storytelling.
    Design/methodology/approach: Using a sample of 281,291 tweets from top global companies in the food sector, automated content analysis (including supervised machine learning) was used to investigate the influence of CSR communication, emotion and aspirational talk on the likelihood that Twitter users will retweet and like tweets from the companies.
    Findings: The findings highlight the importance of aspirational talk and of engaging users in CSR messages. Furthermore, the study revealed that the companies and brands that tweeted more frequently about CSR were associated with higher overall levels of content diffusion and endorsement.
    Originality/value: This study provides important insights into key aspects of communicating about CSR issues on social networking sites such as Twitter and makes several practical recommendations for companies.
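    As a toy illustration of automated content analysis, the sketch below labels tweets as CSR-related with a simple keyword dictionary and compares average retweet counts between groups. This is a much simpler stand-in for the paper's supervised machine-learning approach, and the tweets, terms and counts are invented for the example.

```python
# invented toy data: (tweet text, retweet count)
tweets = [
    ("our new sustainability pledge cuts emissions", 120),
    ("try our new burger today", 15),
    ("supporting local farmers through fair trade", 95),
    ("weekend deal two for one fries", 22),
]

# illustrative CSR keyword dictionary (not from the paper)
CSR_TERMS = {"sustainability", "emissions", "fair", "farmers", "community"}

def is_csr(text):
    """Dictionary-based classification: any CSR term present in the tweet."""
    return any(term in text.split() for term in CSR_TERMS)

csr_rts = [rt for text, rt in tweets if is_csr(text)]
other_rts = [rt for text, rt in tweets if not is_csr(text)]
avg_csr = sum(csr_rts) / len(csr_rts)
avg_other = sum(other_rts) / len(other_rts)
```

    A supervised approach would replace the fixed dictionary with a classifier trained on hand-coded tweets, but the downstream comparison of diffusion metrics between message categories works the same way.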

    Addressing model bias and uncertainty in three dimensional groundwater transport forecasts for a physical aquifer experiment

    This work contributes a combination of laboratory-based aquifer tracer experimentation and bias-aware Ensemble Kalman Filtering (EnKF) to demonstrate that systematic modeling errors (or bias) in source loading dynamics and the spatial distribution of hydraulic conductivity pose severe challenges for groundwater transport forecasting under uncertainty. The impacts of model bias were evaluated using an ammonium chloride tracer experiment conducted in a three-dimensional laboratory tank aquifer with 105 near-real-time sampling locations. This study contributes a bias-aware EnKF framework that (i) dramatically enhances the accuracy of concentration breakthrough forecasts in the presence of systematic, spatio-temporally correlated modeling errors, (ii) clarifies in space and time where transport gradients are maximally impacted by model bias, and (iii) expands the size and scope of flow-and-transport problems that can be considered in the future. Citation: Kollat, J. B., P. M. Reed, and D. M. Rizzo (2008), Addressing model bias and uncertainty in three dimensional groundwater transport forecasts for a physical aquifer experiment, Geophys. Res. Lett., 35, L17402.
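    The bias-aware extension described in the abstract is beyond a short sketch, but the core EnKF analysis step such a framework builds on can be illustrated as follows. This is a generic perturbed-observation ensemble update, not the authors' implementation; all names and shapes are illustrative.

```python
import numpy as np

def enkf_update(ensemble, obs, obs_operator, obs_var, rng):
    """One generic EnKF analysis step with perturbed observations.

    ensemble:     (n_state, n_members) forecast states
    obs:          (n_obs,) observed values
    obs_operator: (n_obs, n_state) linear observation matrix H
    obs_var:      scalar observation-error variance
    """
    n_obs, n_mem = len(obs), ensemble.shape[1]
    Hx = obs_operator @ ensemble                         # predicted observations
    X = ensemble - ensemble.mean(axis=1, keepdims=True)  # state anomalies
    Y = Hx - Hx.mean(axis=1, keepdims=True)              # observation anomalies
    Pxy = X @ Y.T / (n_mem - 1)                          # state-obs covariance
    Pyy = Y @ Y.T / (n_mem - 1) + obs_var * np.eye(n_obs)
    K = Pxy @ np.linalg.inv(Pyy)                         # Kalman gain
    # perturb observations so the analysis ensemble keeps correct spread
    perturbed = obs[:, None] + rng.normal(0.0, np.sqrt(obs_var), (n_obs, n_mem))
    return ensemble + K @ (perturbed - Hx)
```

    A bias-aware variant additionally augments the state with bias terms so that systematic (non-zero-mean) model errors are estimated alongside the concentrations rather than being absorbed into the noise.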

    From maps to movies: high-resolution time-varying sensitivity analysis for spatially distributed watershed models

    Distributed watershed models are now widely used in practice to simulate runoff responses at high spatial and temporal resolutions. Counter to this purpose, diagnostic analyses of distributed models currently aggregate performance measures in space and/or time and are thus disconnected from the models' operational and scientific goals. To address this disconnect, this study contributes a novel approach for computing and visualizing time-varying global sensitivity indices for spatially distributed model parameters. The high-resolution model diagnostics employ the method of Morris to identify evolving patterns in dominant model processes at sub-daily timescales over a six-month period. The method is demonstrated on the United States National Weather Service's Hydrology Laboratory Research Distributed Hydrologic Model (HL-RDHM) in the Blue River watershed, Oklahoma, USA. Three hydrologic events are selected from within the six-month period to investigate the patterns in spatiotemporal sensitivities that emerge as a function of forcing patterns as well as wet-to-dry transitions. Events with similar magnitudes and durations exhibit significantly different performance controls in space and time, indicating that the diagnostic inferences drawn from representative events will be heavily biased by the a priori selection of those events. By contrast, this study demonstrates high-resolution time-varying sensitivity analysis, requiring no assumptions regarding representative events and allowing modelers to identify transitions between sets of dominant parameters or processes a posteriori. The proposed approach details the dynamics of parameter sensitivity in nearly continuous time, providing critical diagnostic insights into the underlying model processes driving predictions. Furthermore, the approach offers the potential to identify transition points between dominant parameters and processes in the absence of observations, such as under nonstationarity.
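    The "maps to movies" idea — sensitivity as a function of time rather than a single aggregated number — can be sketched with a toy two-parameter model whose dominant parameter changes over the simulation. The model, parameter names and step size below are illustrative only, far simpler than HL-RDHM.

```python
import numpy as np

def model(params, t):
    """Toy hydrograph: a fast-decaying component dominates early times,
    a slowly rising component dominates late times."""
    fast, slow = params
    return fast * np.exp(-t) + slow * (1.0 - np.exp(-0.1 * t))

t = np.linspace(0.0, 30.0, 61)      # output time steps
base = np.array([1.0, 1.0])         # nominal parameter values
delta = 0.1                         # one-at-a-time perturbation size

# elementary effect of each parameter at every output time step
ee = np.array([
    (model(base + delta * np.eye(2)[i], t) - model(base, t)) / delta
    for i in range(2)
])

# each parameter's share of total sensitivity per time step: the "movie"
share = np.abs(ee) / np.abs(ee).sum(axis=0)
```

    Plotting `share` over `t` (or, for a distributed model, rendering one sensitivity map per time step) shows the hand-off from the fast to the slow parameter, which a time-aggregated index would average away.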

    Technical Note: Method of Morris effectively reduces the computational demands of global sensitivity analysis for distributed watershed models

    The increase in spatially distributed hydrologic modeling warrants a corresponding increase in diagnostic methods capable of analyzing complex models with large numbers of parameters. Sobol' sensitivity analysis has proven to be a valuable tool for diagnostic analyses of hydrologic models. However, for many spatially distributed models, the Sobol' method requires a prohibitive number of model evaluations to reliably decompose output variance across the full set of parameters. We investigate the potential of the method of Morris, a screening-based sensitivity approach, to provide results sufficiently similar to those of the Sobol' method at a greatly reduced computational expense. The methods are benchmarked on the Hydrology Laboratory Research Distributed Hydrologic Model (HL-RDHM) over a six-month period in the Blue River watershed, Oklahoma, USA. The Sobol' method required over six million model evaluations to ensure reliable sensitivity indices, corresponding to more than 30 000 computing hours and roughly 180 gigabytes of storage space. We find that the method of Morris is able to correctly screen the most and least sensitive parameters with 300 times fewer model evaluations, requiring only 100 computing hours and 1 gigabyte of storage space. The method of Morris proves to be a promising diagnostic approach for global sensitivity analysis of highly parameterized, spatially distributed hydrologic models.
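    The method of Morris named above can be sketched in a few lines: one-at-a-time elementary effects are collected along random trajectories through parameter space, and the mean absolute effect (mu*) per parameter is used for screening. This is a minimal illustration with an arbitrary fixed step size, simplified relative to how it would be applied to HL-RDHM; the function and toy model are illustrative.

```python
import numpy as np

def morris_mu_star(model, n_params, n_trajectories=10, delta=0.5, seed=0):
    """Minimal Morris screening: mean absolute elementary effect per parameter.

    `model` maps a parameter vector in [0, 1]^n_params to a scalar output.
    Cost is n_trajectories * (n_params + 1) model evaluations, versus the
    far larger budgets a full Sobol' variance decomposition requires.
    """
    rng = np.random.default_rng(seed)
    effects = [[] for _ in range(n_params)]
    for _ in range(n_trajectories):
        x = rng.uniform(0.0, 1.0 - delta, size=n_params)  # room to step +delta
        y = model(x)
        for i in rng.permutation(n_params):  # one-at-a-time trajectory steps
            x_new = x.copy()
            x_new[i] += delta
            y_new = model(x_new)
            effects[i].append((y_new - y) / delta)  # elementary effect
            x, y = x_new, y_new
    return np.array([np.mean(np.abs(e)) for e in effects])

# toy model dominated by its first parameter; mu* ranks it most sensitive
mu = morris_mu_star(lambda p: 10.0 * p[0] + 0.1 * p[1], n_params=2)
```

    Screening with mu* identifies which parameters deserve a subsequent, more expensive variance-based analysis, which is the cost saving the note quantifies.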